Posted to common-user@hadoop.apache.org by Ravi Joshi <ra...@yahoo.com> on 2012/05/17 15:18:18 UTC

Unable to work with Hadoop 1.0.1 using eclipse-indigo

Hi, I recently downloaded and successfully installed hadoop-1.0.1 on my
Ubuntu 10.04 LTS machine. I have hadoop-1.0.1.tar.gz downloaded and now I
want to develop a MapReduce application. As suggested by some blogs, the
first step is to install the Eclipse plugin for Hadoop, which is supposed
to be located under contrib/eclipse-plugin, but in my hadoop-1.0.1.tar.gz
there is no eclipse-plugin inside the contrib directory. The contrib
directory only contains datajoin, failmon, gridmix, hdfsproxy, hod, index,
streaming and vaidya. When I looked under src/contrib, I can find an
eclipse-plugin directory but no jar file.
I haven't worked with Hadoop under Eclipse before. Can somebody please
explain the plugin installation to me, so that I can start MapReduce
development?
Thank you.

-Ravi Joshi

Fwd: Unable to work with Hadoop 1.0.1 using eclipse-indigo

Posted by Saravanan Nagarajan <sa...@gmail.com>.
Hi,

I tried the mentioned link, but I'm getting the error below. Could you send
me the eclipse-plugin jar file to my mail ID, or send me the correct link?

"Error (404) We can't find the page you're looking for. Check out our Help
Center <https://www.dropbox.com/help> and forums
<http://forums.dropbox.com/> for
help, or head back to home <https://www.dropbox.com/home>."

Regards,
Saravanan


On Fri, May 18, 2012 at 2:07 AM, Zhiwei Lin <zh...@gmail.com> wrote:

> Hi Ravi,
>
>
> There is a compiled plugin available.
>
> http://dl.dropbox.com/u/24999702/Apache/hadoop-eclipse-plugin-1.0.0.jar
>
> You can follow this link on how to use the Eclipse plugin:
>
> http://v-lad.org/Tutorials/Hadoop/13.5%20-%20copy%20hadoop%20plugin.html
>
>
>
> On 17 May 2012 17:29, Ravi Joshi <ra...@yahoo.com> wrote:
>
> > Hi Jagat, I managed everything and the program is now working. In the
> > arguments I was initially writing ~/Desktop/input/doc ~/Desktop/output,
> > which was giving the error (I don't know why!). I then changed them
> > slightly to ./Input/doc ./output (and moved the input and output
> > directories inside the project root directory).
> >
> > Thank you.
> >
> > -Ravi Joshi
> >
> > --- On Thu, 17/5/12, Jagat <ja...@gmail.com> wrote:
> >
> > From: Jagat <ja...@gmail.com>
> > Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
> > To: common-user@hadoop.apache.org
> > Received: Thursday, 17 May, 2012, 3:46 PM
> >
> > Can you check why it's saying
> >
> > input path does not exist:
> > file:/home/hduser/Desktop/Eclipse_Workspace/K-Means
> > Algorithm/~/Desktop/input/doc
> >
>
>
>
> --
>
> Best wishes.
>
> Zhiwei
>

Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo

Posted by Zhiwei Lin <zh...@gmail.com>.
Hi Ravi,


There is a compiled plugin available.

http://dl.dropbox.com/u/24999702/Apache/hadoop-eclipse-plugin-1.0.0.jar

You can follow this link on how to use the Eclipse plugin:

http://v-lad.org/Tutorials/Hadoop/13.5%20-%20copy%20hadoop%20plugin.html



On 17 May 2012 17:29, Ravi Joshi <ra...@yahoo.com> wrote:

> Hi Jagat, I managed everything and the program is now working. In the
> arguments I was initially writing ~/Desktop/input/doc ~/Desktop/output,
> which was giving the error (I don't know why!). I then changed them
> slightly to ./Input/doc ./output (and moved the input and output
> directories inside the project root directory).
>
> Thank you.
>
> -Ravi Joshi
>
> --- On Thu, 17/5/12, Jagat <ja...@gmail.com> wrote:
>
> From: Jagat <ja...@gmail.com>
> Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
> To: common-user@hadoop.apache.org
> Received: Thursday, 17 May, 2012, 3:46 PM
>
> Can you check why it's saying
>
> input path does not exist:
> file:/home/hduser/Desktop/Eclipse_Workspace/K-Means
> Algorithm/~/Desktop/input/doc
>



-- 

Best wishes.

Zhiwei

Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo

Posted by Ravi Joshi <ra...@yahoo.com>.
Hi Jagat, I managed everything and the program is now working. In the arguments I was initially writing ~/Desktop/input/doc ~/Desktop/output, which was giving the error (I don't know why!). I then changed them slightly to ./Input/doc ./output (and moved the input and output directories inside the project root directory).

Thank you.

-Ravi Joshi

--- On Thu, 17/5/12, Jagat <ja...@gmail.com> wrote:

From: Jagat <ja...@gmail.com>
Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
To: common-user@hadoop.apache.org
Received: Thursday, 17 May, 2012, 3:46 PM

Can you check why it's saying

input path does not exist:
file:/home/hduser/Desktop/Eclipse_Workspace/K-Means
Algorithm/~/Desktop/input/doc

Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo

Posted by Jagat <ja...@gmail.com>.
Can you check why it's saying

input path does not exist:
file:/home/hduser/Desktop/Eclipse_Workspace/K-Means
Algorithm/~/Desktop/input/doc

Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo

Posted by Harsh J <ha...@cloudera.com>.
Ravi,

The ~ in ~/Desktop/input/doc isn't resolvable by the code, AFAIK. A shell
usually resolves that, and you seem to be running the job from Eclipse
(which therefore won't resolve it). So provide absolute paths as input
arguments instead.
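A minimal sketch of that suggestion, assuming the driver reads the input and
output paths from args[0] and args[1]; the expandHome helper is hypothetical
and only mimics what the shell would otherwise have done:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobConf;

public class PathArgs {

    // Hadoop treats "~" as a literal directory name, so expand it the way a
    // shell would before building the Path objects.
    static String expandHome(String arg) {
        return arg.startsWith("~")
                ? System.getProperty("user.home") + arg.substring(1)
                : arg;
    }

    static void configurePaths(JobConf conf, String[] args) {
        FileInputFormat.setInputPaths(conf, new Path(expandHome(args[0])));
        FileOutputFormat.setOutputPath(conf, new Path(expandHome(args[1])));
    }
}

Passing plain absolute paths on the Run Configuration's arguments tab works
just as well; the helper only exists because Eclipse does not run the program
through a shell.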

On Thu, May 17, 2012 at 8:37 PM, Ravi Joshi <ra...@yahoo.com> wrote:
> Now I added all the jar files which came with the hadoop-1.0.1.tar.gz package, but some new errors are showing. This time I am following WordCount v2 (http://hadoop.apache.org/common/docs/current/mapred_tutorial.html#Example%3a+WordCount+v2.0). Following is the error.
>
> 12/05/17 20:31:43 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 12/05/17 20:31:43 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
> 12/05/17 20:31:43 INFO mapred.JobClient: Cleaning up the staging area file:/tmp/hadoop-hduser/mapred/staging/hduser-481041798/.staging/job_local_0001
> 12/05/17 20:31:43 ERROR security.UserGroupInformation: PriviledgedActionException as:hduser cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/home/hduser/Desktop/Eclipse_Workspace/K-Means Algorithm/~/Desktop/input/doc
> Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/home/hduser/Desktop/Eclipse_Workspace/K-Means Algorithm/~/Desktop/input/doc
>     at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
>     at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
>     at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:989)
>     at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:981)
>     at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
>     at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
>     at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
>     at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
>     at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
>     at test.WordCount.run(WordCount.java:131)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>     at test.WordCount.main(WordCount.java:136)
>
>
> -Ravi Joshi
>
> --- On Thu, 17/5/12, Ravi Joshi <ra...@yahoo.com> wrote:
>
> From: Ravi Joshi <ra...@yahoo.com>
> Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
> To: common-user@hadoop.apache.org
> Received: Thursday, 17 May, 2012, 2:16 PM
>
> Hi, I added hadoop-core-1.0.1.jar to the project class path. I am testing WordCount (http://hadoop.apache.org/common/docs/current/mapred_tutorial.html#Example%3A+WordCount+v1.0), but when I try to run my WordCount.java in Eclipse, it shows the following errors:
>
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
>     at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:143)
>     at test.WordCount.main(WordCount.java:56)
> Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>     ... 2 more
>
> -Ravi Joshi
>
> --- On Thu, 17/5/12, Ravi Joshi <ra...@yahoo.com> wrote:
>
> From: Ravi Joshi <ra...@yahoo.com>
> Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
> To: common-user@hadoop.apache.org
> Received: Thursday, 17 May, 2012, 1:37 PM
>
> Hi Jagat, thank you so much for answering the question. Can you please tell me the names and locations of all the jar files which must be added to the project? I am using hadoop-1.0.1 with Eclipse Indigo on Ubuntu 10.04 LTS.
> Thank you.
>
> -Ravi Joshi
>
> --- On Thu, 17/5/12, Jagat <ja...@gmail.com> wrote:
>
> From: Jagat <ja...@gmail.com>
> Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
> To: common-user@hadoop.apache.org
> Received: Thursday, 17 May, 2012, 1:32 PM
>
> Hello Ravi
>
> To create MapReduce programs, the plugin is not mandatory.
>
> Just download Hadoop,
> create one Java project in Eclipse,
> add the jar files from the Hadoop home folder (from the share folder in
> Hadoop 2.x) to the project class path,
> create your Mapper, Reducer and Driver classes,
> and run it.
>
>
>
>
>
> On Thu, May 17, 2012 at 6:48 PM, Ravi Joshi <ra...@yahoo.com> wrote:
>
>> Hi, I recently downloaded and successfully installed hadoop-1.0.1 on my
>> Ubuntu 10.04 LTS machine. I have hadoop-1.0.1.tar.gz downloaded and now I
>> want to develop a MapReduce application. As suggested by some blogs, the
>> first step is to install the Eclipse plugin for Hadoop, which is supposed
>> to be located under contrib/eclipse-plugin, but in my hadoop-1.0.1.tar.gz
>> there is no eclipse-plugin inside the contrib directory. The contrib
>> directory only contains datajoin, failmon, gridmix, hdfsproxy, hod, index,
>> streaming and vaidya. When I looked under src/contrib, I can find an
>> eclipse-plugin directory but no jar file.
>> I haven't worked with Hadoop under Eclipse before. Can somebody please
>> explain the plugin installation to me, so that I can start MapReduce
>> development?
>> Thank you.
>>
>> -Ravi Joshi



-- 
Harsh J

Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo

Posted by Ravi Joshi <ra...@yahoo.com>.
Now I added all the jar files which came with the hadoop-1.0.1.tar.gz package, but some new errors are showing. This time I am following WordCount v2 (http://hadoop.apache.org/common/docs/current/mapred_tutorial.html#Example%3a+WordCount+v2.0). Following is the error.

12/05/17 20:31:43 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/05/17 20:31:43 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
12/05/17 20:31:43 INFO mapred.JobClient: Cleaning up the staging area file:/tmp/hadoop-hduser/mapred/staging/hduser-481041798/.staging/job_local_0001
12/05/17 20:31:43 ERROR security.UserGroupInformation: PriviledgedActionException as:hduser cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/home/hduser/Desktop/Eclipse_Workspace/K-Means Algorithm/~/Desktop/input/doc
Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/home/hduser/Desktop/Eclipse_Workspace/K-Means Algorithm/~/Desktop/input/doc
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
    at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:989)
    at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:981)
    at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
    at test.WordCount.run(WordCount.java:131)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at test.WordCount.main(WordCount.java:136)
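The "No job jar file set" warning in the log above points at JobConf(Class) /
JobConf#setJar(String). A minimal sketch of what that looks like, assuming a
driver class named WordCountDriver (the name is only for illustration); in
Eclipse's local mode the warning is harmless, but on a real cluster the job
jar is how the task JVMs find the user classes:

import org.apache.hadoop.mapred.JobConf;

public class WordCountDriver {
    public static JobConf createConf() {
        // Passing the driver class (or calling conf.setJarByClass(...)) lets
        // Hadoop work out which jar contains the user's Mapper and Reducer
        // classes, which is what the "No job jar file set" warning is about.
        JobConf conf = new JobConf(WordCountDriver.class);
        conf.setJobName("wordcount");
        return conf;
    }
}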


-Ravi Joshi

--- On Thu, 17/5/12, Ravi Joshi <ra...@yahoo.com> wrote:

From: Ravi Joshi <ra...@yahoo.com>
Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
To: common-user@hadoop.apache.org
Received: Thursday, 17 May, 2012, 2:16 PM

Hi, I added hadoop-core-1.0.1.jar to the project class path. I am testing WordCount (http://hadoop.apache.org/common/docs/current/mapred_tutorial.html#Example%3A+WordCount+v1.0), but when I try to run my WordCount.java in Eclipse, it shows the following errors:


Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
    at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:143)
    at test.WordCount.main(WordCount.java:56)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
    at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
    ... 2 more 

-Ravi Joshi

--- On Thu, 17/5/12, Ravi Joshi <ra...@yahoo.com> wrote:

From: Ravi Joshi <ra...@yahoo.com>
Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
To: common-user@hadoop.apache.org
Received: Thursday, 17 May, 2012, 1:37 PM

Hi Jagat, thank you so much for answering the question. Can you please tell me the names and locations of all the jar files which must be added to the project? I am using hadoop-1.0.1 with Eclipse Indigo on Ubuntu 10.04 LTS.
Thank you.

-Ravi Joshi

--- On Thu, 17/5/12, Jagat <ja...@gmail.com> wrote:

From: Jagat <ja...@gmail.com>
Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
To: common-user@hadoop.apache.org
Received: Thursday, 17 May, 2012, 1:32 PM

Hello Ravi

To create MapReduce programs, the plugin is not mandatory.

Just download Hadoop,
create one Java project in Eclipse,
add the jar files from the Hadoop home folder (from the share folder in
Hadoop 2.x) to the project class path,
create your Mapper, Reducer and Driver classes,
and run it.





On Thu, May 17, 2012 at 6:48 PM, Ravi Joshi <ra...@yahoo.com> wrote:

> Hi, I recently downloaded and successfully installed hadoop-1.0.1 on my
> Ubuntu 10.04 LTS machine. I have hadoop-1.0.1.tar.gz downloaded and now I
> want to develop a MapReduce application. As suggested by some blogs, the
> first step is to install the Eclipse plugin for Hadoop, which is supposed
> to be located under contrib/eclipse-plugin, but in my hadoop-1.0.1.tar.gz
> there is no eclipse-plugin inside the contrib directory. The contrib
> directory only contains datajoin, failmon, gridmix, hdfsproxy, hod, index,
> streaming and vaidya. When I looked under src/contrib, I can find an
> eclipse-plugin directory but no jar file.
> I haven't worked with Hadoop under Eclipse before. Can somebody please
> explain the plugin installation to me, so that I can start MapReduce
> development?
> Thank you.
>
> -Ravi Joshi

Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo

Posted by Ravi Joshi <ra...@yahoo.com>.
Hi, I added hadoop-core-1.0.1.jar to the project class path. I am testing WordCount (http://hadoop.apache.org/common/docs/current/mapred_tutorial.html#Example%3A+WordCount+v1.0), but when I try to run my WordCount.java in Eclipse, it shows the following errors:


Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
    at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:143)
    at test.WordCount.main(WordCount.java:56)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
    at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
    ... 2 more 

-Ravi Joshi

--- On Thu, 17/5/12, Ravi Joshi <ra...@yahoo.com> wrote:

From: Ravi Joshi <ra...@yahoo.com>
Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
To: common-user@hadoop.apache.org
Received: Thursday, 17 May, 2012, 1:37 PM

Hi Jagat, thank you so much for answering the question. Can you please tell me the names and locations of all the jar files which must be added to the project? I am using hadoop-1.0.1 with Eclipse Indigo on Ubuntu 10.04 LTS.
Thank you.

-Ravi Joshi

--- On Thu, 17/5/12, Jagat <ja...@gmail.com> wrote:

From: Jagat <ja...@gmail.com>
Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
To: common-user@hadoop.apache.org
Received: Thursday, 17 May, 2012, 1:32 PM

Hello Ravi

To create MapReduce programs, the plugin is not mandatory.

Just download Hadoop,
create one Java project in Eclipse,
add the jar files from the Hadoop home folder (from the share folder in
Hadoop 2.x) to the project class path,
create your Mapper, Reducer and Driver classes,
and run it.





On Thu, May 17, 2012 at 6:48 PM, Ravi Joshi <ra...@yahoo.com> wrote:

> Hi, I recently downloaded and successfully installed hadoop-1.0.1 on my
> Ubuntu 10.04 LTS machine. I have hadoop-1.0.1.tar.gz downloaded and now I
> want to develop a MapReduce application. As suggested by some blogs, the
> first step is to install the Eclipse plugin for Hadoop, which is supposed
> to be located under contrib/eclipse-plugin, but in my hadoop-1.0.1.tar.gz
> there is no eclipse-plugin inside the contrib directory. The contrib
> directory only contains datajoin, failmon, gridmix, hdfsproxy, hod, index,
> streaming and vaidya. When I looked under src/contrib, I can find an
> eclipse-plugin directory but no jar file.
> I haven't worked with Hadoop under Eclipse before. Can somebody please
> explain the plugin installation to me, so that I can start MapReduce
> development?
> Thank you.
>
> -Ravi Joshi

Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo

Posted by Ravi Joshi <ra...@yahoo.com>.
Hi Jagat, thank you so much for answering the question. Can you please tell me the names and locations of all the jar files which must be added to the project? I am using hadoop-1.0.1 with Eclipse Indigo on Ubuntu 10.04 LTS.
Thank you.

-Ravi Joshi

--- On Thu, 17/5/12, Jagat <ja...@gmail.com> wrote:

From: Jagat <ja...@gmail.com>
Subject: Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo
To: common-user@hadoop.apache.org
Received: Thursday, 17 May, 2012, 1:32 PM

Hello Ravi

To create MapReduce programs, the plugin is not mandatory.

Just download Hadoop,
create one Java project in Eclipse,
add the jar files from the Hadoop home folder (from the share folder in
Hadoop 2.x) to the project class path,
create your Mapper, Reducer and Driver classes,
and run it.





On Thu, May 17, 2012 at 6:48 PM, Ravi Joshi <ra...@yahoo.com> wrote:

> Hi, I recently downloaded and successfully installed hadoop-1.0.1 on my
> Ubuntu 10.04 LTS machine. I have hadoop-1.0.1.tar.gz downloaded and now I
> want to develop a MapReduce application. As suggested by some blogs, the
> first step is to install the Eclipse plugin for Hadoop, which is supposed
> to be located under contrib/eclipse-plugin, but in my hadoop-1.0.1.tar.gz
> there is no eclipse-plugin inside the contrib directory. The contrib
> directory only contains datajoin, failmon, gridmix, hdfsproxy, hod, index,
> streaming and vaidya. When I looked under src/contrib, I can find an
> eclipse-plugin directory but no jar file.
> I haven't worked with Hadoop under Eclipse before. Can somebody please
> explain the plugin installation to me, so that I can start MapReduce
> development?
> Thank you.
>
> -Ravi Joshi

Re: Unable to work with Hadoop 1.0.1 using eclipse-indigo

Posted by Jagat <ja...@gmail.com>.
Hello Ravi

To create MapReduce programs, the plugin is not mandatory.

Just download Hadoop,
create one Java project in Eclipse,
add the jar files from the Hadoop home folder (from the share folder in
Hadoop 2.x) to the project class path,
create your Mapper, Reducer and Driver classes,
and run it.
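For reference, a minimal sketch of those steps using the old
org.apache.hadoop.mapred API that the stack traces in this thread use;
package and class names are only illustrative, not the tutorial's exact code:

import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

public class WordCount {

    // Mapper: emits (word, 1) for every token in the input line.
    public static class Map extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        public void map(LongWritable key, Text value,
                        OutputCollector<Text, IntWritable> output,
                        Reporter reporter) throws IOException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    output.collect(word, ONE);
                }
            }
        }
    }

    // Reducer: sums the counts emitted for each word.
    public static class Reduce extends MapReduceBase
            implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                           OutputCollector<Text, IntWritable> output,
                           Reporter reporter) throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    // Driver: wires the mapper, reducer and input/output paths together.
    public static void main(String[] args) throws IOException {
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("wordcount");
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);
        conf.setMapperClass(Map.class);
        conf.setReducerClass(Reduce.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        JobClient.runJob(conf);
    }
}

Run it from Eclipse with two program arguments: an input path that exists and
an output directory that does not yet exist.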





On Thu, May 17, 2012 at 6:48 PM, Ravi Joshi <ra...@yahoo.com> wrote:

> Hi, I recently downloaded and successfully installed hadoop-1.0.1 on my
> Ubuntu 10.04 LTS machine. I have hadoop-1.0.1.tar.gz downloaded and now I
> want to develop a MapReduce application. As suggested by some blogs, the
> first step is to install the Eclipse plugin for Hadoop, which is supposed
> to be located under contrib/eclipse-plugin, but in my hadoop-1.0.1.tar.gz
> there is no eclipse-plugin inside the contrib directory. The contrib
> directory only contains datajoin, failmon, gridmix, hdfsproxy, hod, index,
> streaming and vaidya. When I looked under src/contrib, I can find an
> eclipse-plugin directory but no jar file.
> I haven't worked with Hadoop under Eclipse before. Can somebody please
> explain the plugin installation to me, so that I can start MapReduce
> development?
> Thank you.
>
> -Ravi Joshi