Posted to common-user@hadoop.apache.org by Bill Brune <bb...@decarta.com> on 2009/11/20 22:01:04 UTC

bad connect ack

Hi, I'm trying to get a small cluster up with HDFS (Hadoop 0.20.1).

It is working somewhat: I can create files and directories and retrieve
them. However, I am seeing a random failure rate (about 25%) where I get
the error below when storing a file, and the resulting file is stored as a
0-length file (as reported by hadoop fs -ls).
I set this all up with straight IP addresses because I have no access to
the DNS servers here.
However, the datanodes do have hostnames that do NOT resolve in DNS
(not sure if that matters).
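
A quick way to see whether name resolution is part of the problem is to check, from each node, whether the hostnames the cluster uses resolve at all. A minimal sketch (substitute the hostnames your datanodes actually report; this is just an illustration, not part of Hadoop):

```python
import socket

def resolution_report(hostnames):
    """Map each hostname to its resolved IP, or to the error it raised."""
    report = {}
    for name in hostnames:
        try:
            report[name] = socket.gethostbyname(name)
        except socket.gaierror as exc:
            # covers "unknown host" and similar resolver failures
            report[name] = f"unresolvable: {exc}"
    return report

if __name__ == "__main__":
    # substitute the hostnames your datanodes actually report
    print(resolution_report(["localhost"]))
```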


Put failed:

09/11/19 15:40:09 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.io.IOException: Bad connect ack with firstBadLink 10.241.4.101:50010
09/11/19 15:40:09 INFO hdfs.DFSClient: Abandoning block blk_8325005803148307980_1082

-etc,etc

I've verified that the java process is running and listening on port
50010 (as reported by netstat -plnt) and that passphraseless ssh is working
fine. (Also, the namenode web page intermittently fails to connect when
asked to browse the filesystem; it works most of the time.)

The logs on that datanode show a java.net.NoRouteToHostException for a 
few blocks, then it seems to start receiving blocks fine.

The log:

2009-11-19 06:12:30,149 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_6892009382281192058_1068 src: /10.241.4.101:41570 dest: /10.241.4.101:50010
2009-11-19 06:12:30,150 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_6892009382281192058_1068 received exception java.net.NoRouteToHostException: No route to host
2009-11-19 06:12:30,151 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.241.4.101:50010, storageID=DS-886003042-127.0.0.1-50010-1258634370537, infoPort=50075, ipcPort=50020):DataXceiver
java.net.NoRouteToHostException: No route to host
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:282)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
        at java.lang.Thread.run(Unknown Source)
2009-11-19 06:12:36,155 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block blk_-7007024210780549144_1068 src: /10.241.4.101:41574 dest: /10.241.4.101:50010
2009-11-19 06:12:36,157 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock blk_-7007024210780549144_1068 received exception java.net.NoRouteToHostException: No route to host
2009-11-19 06:12:36,157 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(10.241.4.101:50010, storageID=DS-886003042-127.0.0.1-50010-1258634370537, infoPort=50075, ipcPort=50020):DataXceiver
java.net.NoRouteToHostException: No route to host
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:282)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:103)
        at java.lang.Thread.run(Unknown Source)
2009-11-19 06:15:31,776 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Verification succeeded for blk_6602943802956432390_1059
2009-11-19 06:37:06,201 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Verification succeeded for blk_-3274535555692844186_1063
2009-11-19 06:45:15,117 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Verification succeeded for blk_753873727864345115_1067
2009-11-19 06:51:56,888 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Verification succeeded for blk_-810392349062598446_1062
2009-11-19 07:03:17,441 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Verification succeeded for blk_-4850126846096803878_1066
2009-11-19 07:09:26,335 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: BlockReport of 12 blocks got processed in 3 msecs



Any ideas would be greatly appreciated.

Thanks  -Bill

Re: bad connect ack

Posted by Jason Venner <ja...@gmail.com>.
At one point my admin staff added 5 machines to my cluster but accidentally
left port 50010, among others, firewalled.
This resulted in chaos for a while until the firewall was found.
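
A quick way to rule that class of problem out is to probe each datanode's data-transfer port over TCP from the machine the client runs on. A minimal sketch (the address below is taken from the log in this thread; substitute your own cluster's IPs):

```python
import socket

def check_ports(targets, timeout=2.0):
    """Return (host, port, reason) triples for targets that could NOT
    be reached over TCP; an empty list means everything was reachable."""
    unreachable = []
    for host, port in targets:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                pass  # connected and closed cleanly: port is open
        except OSError as exc:  # no route to host, refused, timeout, ...
            unreachable.append((host, port, str(exc)))
    return unreachable

if __name__ == "__main__":
    # substitute the datanode addresses from your own cluster
    datanodes = [("10.241.4.101", 50010)]
    for host, port, reason in check_ports(datanodes):
        print(f"cannot reach {host}:{port}: {reason}")
```

A "no route to host" here, with the daemon confirmed listening via netstat, points at a firewall or routing problem rather than at HDFS itself.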

On Fri, Nov 20, 2009 at 1:01 PM, Bill Brune <bb...@decarta.com> wrote:

> [original message and log quoted in full; trimmed]



-- 
Pro Hadoop, a book to guide you from beginner to hadoop mastery,
http://www.amazon.com/dp/1430219424?tag=jewlerymall
www.prohadoopbook.com a community for Hadoop Professionals

Re: build / install hadoop plugin question

Posted by z3r0c001 <ce...@gmail.com>.
You need "ant" to build.  There are a plethora of tutorials out there for
installing and setting up ant, like
http://ant.apache.org/manual/install.htm

Once you have it set up, go to the extracted hadoop dir in a command
prompt/shell and run "ant eclipse"

To see more build options, run "ant -p"
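
The sequence above could be sketched as a dry run like this (the source and eclipse paths, and the jar's output location under build/contrib, are my assumptions for a typical 0.20.x tarball; remove the echoes to actually execute):

```shell
# Assumed locations -- adjust to your own checkout and eclipse install.
HADOOP_SRC="$HOME/hadoop-0.20.1"
ECLIPSE_PLUGINS="$HOME/eclipse/plugins"

# Printed as a dry run; drop the 'echo' prefixes to run the steps for real.
echo "cd $HADOOP_SRC && ant eclipse"   # build the plugin jar with ant
echo "cp $HADOOP_SRC/build/contrib/eclipse-plugin/*.jar $ECLIPSE_PLUGINS/"
echo "ant -p   # list the other available build targets"
```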

-d


On Nov 20, 2009, at 7:23 PM, Raymond Jennings III  
<ra...@yahoo.com> wrote:

> Could you explain further on how to do this.  I have never built a  
> plugin before.  Do I do this from within eclipse?  Thanks!
>
> [earlier messages in the thread trimmed]

Re: build / install hadoop plugin question

Posted by Raymond Jennings III <ra...@yahoo.com>.
Could you explain further how to do this?  I have never built a plugin before.  Do I do this from within eclipse?  Thanks!

--- On Fri, 11/20/09, Dhaivat Pandit <ce...@gmail.com> wrote:

> From: Dhaivat Pandit <ce...@gmail.com>
> Subject: Re: build / install hadoop plugin question
> To: "common-user@hadoop.apache.org" <co...@hadoop.apache.org>
> Date: Friday, November 20, 2009, 9:53 PM
> Yes, if it's not built you can do "ant
> eclipse". It will generate the plugin jar and you can paste
> it in the plugins directory.
> 
> -dp
> 
> 
> [earlier messages in the thread trimmed]

Re: build / install hadoop plugin question

Posted by Dhaivat Pandit <ce...@gmail.com>.
Yes, if it's not built you can do "ant eclipse". It will generate the
plugin jar and you can paste it into the plugins directory.

-dp


On Nov 20, 2009, at 6:49 PM, Raymond Jennings III  
<ra...@yahoo.com> wrote:

> That's what I would normally do for a plugin but this has a
> sub-directory of "eclipse-plugin" (and not plugins) and the files are
> all java files and not class files.  This is in the hadoop directory of
> src/contrib/eclipse-plugin.  It looks to me like it has to be built
> first and then copied into the plugins directory?
>
> [earlier messages in the thread trimmed]

Re: build / install hadoop plugin question

Posted by Raymond Jennings III <ra...@yahoo.com>.
That's what I would normally do for a plugin, but this has a sub-directory of "eclipse-plugin" (and not plugins) and the files are all java files and not class files.  This is in the hadoop directory src/contrib/eclipse-plugin.  It looks to me like it has to be built first and then copied into the plugins directory?

--- On Fri, 11/20/09, Dhaivat Pandit <ce...@gmail.com> wrote:

> From: Dhaivat Pandit <ce...@gmail.com>
> Subject: Re: build / install hadoop plugin question
> To: "common-user@hadoop.apache.org" <co...@hadoop.apache.org>
> Date: Friday, November 20, 2009, 9:05 PM
> Just paste it in eclipse installation
> plugins folder and restart eclipse
> 
> -dp
> 
> 
> [original question trimmed]

Re: build / install hadoop plugin question

Posted by Dhaivat Pandit <ce...@gmail.com>.
Just paste it into the eclipse installation's plugins folder and restart eclipse.

-dp


On Nov 20, 2009, at 2:08 PM, Raymond Jennings III  
<ra...@yahoo.com> wrote:

> [original question trimmed]

build / install hadoop plugin question

Posted by Raymond Jennings III <ra...@yahoo.com>.
The plugin that is included in the hadoop distribution under src/contrib/eclipse-plugin - how does that get installed?  It does not appear to be in a standard plugin format.  Do I have to build it first, and if so can you tell me how?  Thanks.  Ray