Posted to common-user@hadoop.apache.org by Garri Santos <ga...@phpugph.com> on 2008/04/17 11:10:44 UTC

getting files in hdfs

Good Day,

I successfully installed and copied a test file to HDFS. I was wondering if
it is possible to directly access the file without getting it out of HDFS
first.


Regards,
Garri

Re: getting files in hdfs

Posted by Garri Santos <ga...@phpugph.com>.
Thanks, Ted, for the clarification. That's a great approach.

Garri

On Fri, Apr 18, 2008 at 10:15 AM, Ted Dunning <td...@veoh.com> wrote:

>
> That isn't (I don't think) what I was referring to.
>
> I meant http://namenode:port/data/path-name
>
> This gives you the entire file at once.
>
>
> On 4/17/08 6:08 PM, "Garri Santos" <ga...@phpugph.com> wrote:
>
> > @Ted
> >
> > yeah I explored that Browse file system and I think I can use that.
> >
> > Is it possible to mount the /user/hadoop?
>
>


-- 
__________________________
Garrizaldy R. Santos
Ubraa Developer
PHP User-Group Philippines Inc.
http://www.phpugph.com
garrizaldy.santos@phpugph.com

Re: getting files in hdfs

Posted by Ted Dunning <td...@veoh.com>.
That isn't (I don't think) what I was referring to.

I meant http://namenode:port/data/path-name

This gives you the entire file at once.


On 4/17/08 6:08 PM, "Garri Santos" <ga...@phpugph.com> wrote:

> @Ted
> 
> yeah I explored that Browse file system and I think I can use that.
> 
> Is it possible to mount the /user/hadoop?
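
Ted's URL pattern can be exercised with a small script. The following is an
illustrative sketch, not something from the thread: the helper names, the
default NameNode web port 50070, and the /data servlet path are assumptions
about the Hadoop build in use -- check your own dfs.http.address setting.

```python
import urllib.request

# Assumption: default NameNode web UI port of this era (dfs.http.address).
DEFAULT_WEB_PORT = 50070

def hdfs_http_url(namenode, path, port=DEFAULT_WEB_PORT):
    """Build the /data URL under which the NameNode web server streams a file."""
    if not path.startswith("/"):
        path = "/" + path
    return "http://%s:%d/data%s" % (namenode, port, path)

def fetch_hdfs_file(namenode, path):
    """Read the whole file over HTTP (the NameNode redirects to a DataNode)."""
    with urllib.request.urlopen(hdfs_http_url(namenode, path)) as resp:
        return resp.read()

# Example (requires a running cluster):
#   data = fetch_hdfs_file("namenode", "/user/hadoop/sample.txt")
```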


Re: getting files in hdfs

Posted by Garri Santos <ga...@phpugph.com>.
Thanks Thomas and Ted,

When I use:

${PATH_TO_HADOOP_INSTALL}/bin/hadoop dfs -ls

it lists:

/user/hadoop/sample.txt

I was wondering how I can access that file from, let's say, a web interface
that links directly to it.

<!-- Like it was just mounted -->
<a href="/user/hadoop/sample.txt">Sample Text</a>

AFAIK the example above will not work, he he he.

Right now, to access the file I have to -get sample.txt /path/to/webroot/ so
that a copy lands in my webroot, and then create a link that points to the
retrieved copy. I was wondering about other ways to bypass that process.

@Ted

Yeah, I explored that Browse file system page and I think I can use that.

Is it possible to mount the /user/hadoop?


Thank you very much,
Garri



On Thu, Apr 17, 2008 at 11:52 PM, Ted Dunning <td...@veoh.com> wrote:

>
> You can also get to the file via HTTP.
>
>
> On 4/17/08 2:43 AM, "Thomas Thevis" <Th...@semgine.com> wrote:
>
> > What do you mean by 'directly access the file'? HDFS provides several
> > file operations. Type '${PATH_TO_HADOOP_INSTALL}/bin/hadoop fs' to see
> > an appropriate usage message.
> >
> > Regards,
> > Thomas
> >
> > Garri Santos wrote:
> >> Good Day,
> >>
> >> I successfully installed and copied a test file to HDFS. I was wondering
> >> if it is possible to directly access the file without getting it out of
> >> HDFS first.
> >>
> >>
> >> Regards,
> >> Garri
> >>
>
>
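
One way to make a link like Garri's work without the -get step is to put a
thin proxy in front of HDFS that redirects each request to the NameNode's
HTTP interface. A minimal stdlib sketch under the same assumptions as
elsewhere in the thread (the /data servlet and port 50070); the host name
and handler are illustrative, not from the thread.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

NAMENODE = "namenode"   # assumption: your NameNode host
WEB_PORT = 50070        # assumption: default web UI port of the era

def to_hdfs_url(request_path, namenode=NAMENODE, port=WEB_PORT):
    """Map a web path such as /user/hadoop/sample.txt to the HDFS HTTP URL."""
    return "http://%s:%d/data%s" % (namenode, port, request_path)

class HdfsRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Send the browser to the NameNode, which streams the file back.
        self.send_response(302)
        self.send_header("Location", to_hdfs_url(self.path))
        self.end_headers()

# To serve (blocks forever); <a href="/user/hadoop/sample.txt"> then resolves:
#   HTTPServer(("", 8000), HdfsRedirectHandler).serve_forever()
```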

Re: getting files in hdfs

Posted by Ted Dunning <td...@veoh.com>.
You can also get to the file via HTTP.


On 4/17/08 2:43 AM, "Thomas Thevis" <Th...@semgine.com> wrote:

> What do you mean by 'directly access the file'? HDFS provides several
> file operations. Type '${PATH_TO_HADOOP_INSTALL}/bin/hadoop fs' to see
> an appropriate usage message.
> 
> Regards,
> Thomas
> 
> Garri Santos wrote:
>> Good Day,
>> 
>> I successfully installed and copied a test file to HDFS. I was wondering if
>> it is possible to directly access the file without getting it out of HDFS
>> first.
>> 
>> 
>> Regards,
>> Garri
>> 


Re: getting files in hdfs

Posted by Thomas Thevis <Th...@semgine.com>.
What do you mean by 'directly access the file'? HDFS provides several 
file operations. Type '${PATH_TO_HADOOP_INSTALL}/bin/hadoop fs' to see 
an appropriate usage message.

Regards,
Thomas

Garri Santos wrote:
> Good Day,
> 
> I successfully installed and copied a test file to HDFS. I was wondering if
> it is possible to directly access the file without getting it out of HDFS
> first.
> 
> 
> Regards,
> Garri
>
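
The fs shell Thomas mentions can also be driven from a script. A small
sketch, assuming a standard install layout; the wrapper names are made up,
but -ls and -cat are real fs shell subcommands:

```python
import subprocess

def hadoop_fs(hadoop_home, *args):
    """Build the argument list for a '${HADOOP_HOME}/bin/hadoop fs' command."""
    return ["%s/bin/hadoop" % hadoop_home, "fs"] + list(args)

def cat_hdfs_file(hadoop_home, path):
    """Return an HDFS file's bytes without copying it out first (-cat)."""
    result = subprocess.run(hadoop_fs(hadoop_home, "-cat", path),
                            capture_output=True, check=True)
    return result.stdout

# Example (requires a Hadoop install and running cluster):
#   print(cat_hdfs_file("/usr/local/hadoop", "/user/hadoop/sample.txt"))
```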