Posted to common-dev@hadoop.apache.org by Sugandha Naolekar <su...@gmail.com> on 2009/06/05 09:31:13 UTC

Few Queries..!!!

Hello!

I have the following queries related to Hadoop:

-> Once I place my data in HDFS, it gets replicated and chunked
automatically over the datanodes. Right? Hadoop takes care of all those
things.

-> Now, suppose there is some third party who is not participating in the
Hadoop program, i.e., he is not one of the nodes of the Hadoop cluster, but
he has some data on his local filesystem. Can I place this data into HDFS?
How?

-> Then, later, that third party asks for a file or a directory or any
kind of data that was previously dumped into HDFS without that third
person's knowledge - he wants it back (wants to retrieve it). The data
should be placed on his local filesystem again, in some specific
directory. How can I do this?

-> Will I have to use Map-Reduce or something else to make it work?

-> Also, if I write map-reduce code for the complete activity, how will
I fetch the data or the files that are chunked in HDFS in the form of blocks,
combine (reassemble) them into a complete file, and place it on the local
filesystem of a node that is not part of the Hadoop cluster setup?

Eagerly waiting for reply!

Thanking You,
Sugandha!



-- 
Regards!
Sugandha

Re: which Java version for hadoop-0.19.1 ?

Posted by Roldano Cattoni <ca...@fbk.eu>.
Thanks again, Stuart.

I definitely need to search better ...

Best

  Roldano


On Wed, Jun 10, 2009 at 07:06:58PM +0200, Stuart White wrote:
> http://hadoop.apache.org/core/docs/r0.19.1/quickstart.html#Required+Software
> 
> 
> On Wed, Jun 10, 2009 at 12:02 PM, Roldano Cattoni <ca...@fbk.eu> wrote:
> 
> > It works, many thanks.
> >
> > Last question: is this information documented somewhere in the package? I
> > was not able to find it.
> >
> >
> >  Roldano
> >
> >
> >
> > On Wed, Jun 10, 2009 at 06:37:08PM +0200, Stuart White wrote:
> > > Java 1.6.
> > >
> > > On Wed, Jun 10, 2009 at 11:33 AM, Roldano Cattoni <ca...@fbk.eu>
> > wrote:
> > >
> > > > A very basic question: which Java version is required for
> > hadoop-0.19.1?
> > > >
> > > > With jre1.5.0_06 I get the error:
> > > >  java.lang.UnsupportedClassVersionError: Bad version number in .class
> > file
> > > >  at java.lang.ClassLoader.defineClass1(Native Method)
> > > >  (..)
> > > >
> > > > By the way hadoop-0.17.2.1 was running successfully with jre1.5.0_06
> > > >
> > > >
> > > > Thanks in advance for your kind help
> > > >
> > > >  Roldano
> > > >
> >

Re: which Java version for hadoop-0.19.1 ?

Posted by Stuart White <st...@gmail.com>.
http://hadoop.apache.org/core/docs/r0.19.1/quickstart.html#Required+Software


On Wed, Jun 10, 2009 at 12:02 PM, Roldano Cattoni <ca...@fbk.eu> wrote:

> It works, many thanks.
>
> Last question: is this information documented somewhere in the package? I
> was not able to find it.
>
>
>  Roldano
>
>
>
> On Wed, Jun 10, 2009 at 06:37:08PM +0200, Stuart White wrote:
> > Java 1.6.
> >
> > On Wed, Jun 10, 2009 at 11:33 AM, Roldano Cattoni <ca...@fbk.eu>
> wrote:
> >
> > > A very basic question: which Java version is required for
> hadoop-0.19.1?
> > >
> > > With jre1.5.0_06 I get the error:
> > >  java.lang.UnsupportedClassVersionError: Bad version number in .class
> file
> > >  at java.lang.ClassLoader.defineClass1(Native Method)
> > >  (..)
> > >
> > > By the way hadoop-0.17.2.1 was running successfully with jre1.5.0_06
> > >
> > >
> > > Thanks in advance for your kind help
> > >
> > >  Roldano
> > >
>

Re: which Java version for hadoop-0.19.1 ?

Posted by Roldano Cattoni <ca...@fbk.eu>.
It works, many thanks.

Last question: is this information documented somewhere in the package? I
was not able to find it.


  Roldano



On Wed, Jun 10, 2009 at 06:37:08PM +0200, Stuart White wrote:
> Java 1.6.
> 
> On Wed, Jun 10, 2009 at 11:33 AM, Roldano Cattoni <ca...@fbk.eu> wrote:
> 
> > A very basic question: which Java version is required for hadoop-0.19.1?
> >
> > With jre1.5.0_06 I get the error:
> >  java.lang.UnsupportedClassVersionError: Bad version number in .class file
> >  at java.lang.ClassLoader.defineClass1(Native Method)
> >  (..)
> >
> > By the way hadoop-0.17.2.1 was running successfully with jre1.5.0_06
> >
> >
> > Thanks in advance for your kind help
> >
> >  Roldano
> >

Re: which Java version for hadoop-0.19.1 ?

Posted by Stuart White <st...@gmail.com>.
Java 1.6.

On Wed, Jun 10, 2009 at 11:33 AM, Roldano Cattoni <ca...@fbk.eu> wrote:

> A very basic question: which Java version is required for hadoop-0.19.1?
>
> With jre1.5.0_06 I get the error:
>  java.lang.UnsupportedClassVersionError: Bad version number in .class file
>  at java.lang.ClassLoader.defineClass1(Native Method)
>  (..)
>
> By the way hadoop-0.17.2.1 was running successfully with jre1.5.0_06
>
>
> Thanks in advance for your kind help
>
>  Roldano
>

which Java version for hadoop-0.19.1 ?

Posted by Roldano Cattoni <ca...@fbk.eu>.
A very basic question: which Java version is required for hadoop-0.19.1?

With jre1.5.0_06 I get the error:
  java.lang.UnsupportedClassVersionError: Bad version number in .class file
  at java.lang.ClassLoader.defineClass1(Native Method)
  (..)

By the way hadoop-0.17.2.1 was running successfully with jre1.5.0_06 


Thanks in advance for your kind help

  Roldano

Re: Few Queries..!!!

Posted by tim robertson <ti...@gmail.com>.
Answers inline

> -> Once I place my data in HDFS, it gets replicated and chunked
> automatically over the datanodes. Right? Hadoop takes care of all those
> things.

Yes, it does.

> -> Now, if there is some third party who is not participating in the Hadoop
> program. Means, he is not one of the nodes of hadoop cluster. Now, he has
> some data on his local filesystem. Thus, can I place this data into HDFS?
> How?

Using the HDFS interface, you would put the data in (not dissimilar to
FTP'ing it to a server).
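As a minimal sketch (assuming a running cluster with a default filesystem configured; the local file /tmp/mydata.txt and the HDFS directory /user/sugandha are made-up names for illustration):

```shell
# Copy a local file into HDFS; replication and block-splitting
# happen automatically once the file is in
hadoop fs -mkdir /user/sugandha
hadoop fs -put /tmp/mydata.txt /user/sugandha/mydata.txt

# Verify it arrived
hadoop fs -ls /user/sugandha
```

The machine running these commands only needs the Hadoop client installed and configured to point at the cluster; it does not have to be a datanode itself.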

> -> Then, later, that third party asks for a file or a directory or any
> kind of data that was previously dumped into HDFS without that third
> person's knowledge - he wants it back (wants to retrieve it). The data
> should be placed on his local filesystem again, in some specific
> directory. How can I do this?

Copy it out of HDFS onto the local file system - it is pretty much
like copying from one mounted drive to another, just that you go
through a Hadoop command (hadoop fs -get, a.k.a. -copyToLocal) rather
than a native command (cp). (distcp, by contrast, is for copying
between clusters.)
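Concretely, a hedged sketch (paths are again made up for illustration):

```shell
# Copy a single file from HDFS back to the local filesystem;
# Hadoop reassembles the blocks transparently
hadoop fs -get /user/sugandha/mydata.txt /home/thirdparty/restored/mydata.txt

# -copyToLocal is an equivalent spelling, and it works on directories too
hadoop fs -copyToLocal /user/sugandha/somedir /home/thirdparty/restored/somedir
```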

> -> Will I have to use Map-Reduce or something else to make it work?

No.  You could write a Java program or use the command-line utilities.
You can probably do it in other languages too, but I only do Java...

> -> Also, if I write map-reduce code for the complete activity, how will
> I fetch the data or the files that are chunked in HDFS in the form of blocks,
> combine (reassemble) them into a complete file, and place it on the local
> filesystem of a node that is not part of the Hadoop cluster setup?

You don't write MR code for putting and retrieving files - this is all
done by Hadoop for you.  Just copy files in and copy files out.

It's probably worth reading the Hadoop command-line guide:
http://hadoop.apache.org/core/docs/r0.19.1/commands_manual.html to get
an understanding of what you can do from the command line.  All of
those command-line utilities can also be used programmatically (i.e.,
from code).
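As a quick taste of what that guide covers, a few commands worth trying (the paths are illustrative, not real):

```shell
# Everyday file system commands from the guide
hadoop fs -ls /                          # list the root of HDFS
hadoop fs -cat /user/sugandha/mydata.txt # print a file's contents
hadoop fs -du /user/sugandha             # show space used per entry
hadoop fs -help                          # summarise all fs commands
```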

Cheers,

Tim




>
> Eagerly waiting for reply!
>
> Thanking You,
> Sugandha!
>
>
>
> --
> Regards!
> Sugandha
>