Posted to common-user@hadoop.apache.org by Erik Holstad <er...@gmail.com> on 2009/02/18 21:16:02 UTC

Problems getting Eclipse Hadoop plugin to work.

I'm using Eclipse 3.3.2 and want to view my remote cluster using the Hadoop
plugin. Everything shows up and I can see the Map/Reduce perspective, but when
trying to connect to a location I get:
"Error: Call failed on local exception"

I've set the host to, for example, xx0, where xx0 is a remote machine
accessible from the terminal, and the ports to 50020/50040 for the M/R master
and DFS master respectively. Is there anything I'm missing to set for remote
access to the Hadoop cluster?

Regards Erik

Re: Hadoop build error

Posted by Matei Zaharia <ma...@cloudera.com>.
Forrest is used just for building documentation, by the way. If you want to
compile the Hadoop core jars, you can run "ant jar", which doesn't require
Forrest.
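For reference, the two invocations look roughly like this (the Java 5 path is
illustrative; the sketch below just prints the commands rather than running
them):

```shell
# Core jars only -- no Forrest or Java 5 required:
CORE_BUILD="ant jar"
# Full build including Forrest-generated docs -- needs a JDK 1.5 home:
FULL_BUILD="ant -Djava5.home=/usr/lib/jvm/java-1.5.0 tar"
printf '%s\n%s\n' "$CORE_BUILD" "$FULL_BUILD"
```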

On Fri, Feb 20, 2009 at 10:41 PM, Abdul Qadeer <qa...@gmail.com>wrote:

> >
> >
> >
> > java5.check:
> >
> > BUILD FAILED
> > /home/raghu/src-hadoop/trunk/build.xml:890: 'java5.home' is not defined.
> >  Forrest requires Java5.  Please pass -Djava5.home=<base of Java 5
> > distribution> to Ant on the command-line.
>
>
>
> I think the error is self-explanatory.  Forrest needs JDK 1.5, and you can
> pass it using the -Djava5.home argument.
> Maybe something like the following:
>
> ant -Djavac.args="-Xlint  -Xmaxwarns 1000"  -Djava5.home={base of Java 5
> distribution} tar
>

Re: Hadoop build error

Posted by Abdul Qadeer <qa...@gmail.com>.
>
>
>
> java5.check:
>
> BUILD FAILED
> /home/raghu/src-hadoop/trunk/build.xml:890: 'java5.home' is not defined.
>  Forrest requires Java5.  Please pass -Djava5.home=<base of Java 5
> distribution> to Ant on the command-line.



I think the error is self-explanatory.  Forrest needs JDK 1.5, and you can
pass it using the -Djava5.home argument.
Maybe something like the following:

ant -Djavac.args="-Xlint  -Xmaxwarns 1000"  -Djava5.home={base of Java 5
distribution} tar
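Concretely, assuming JDK 1.5 is installed under /usr/lib/jvm/java-1.5.0-sun (a
hypothetical path; adjust for your system), this sketch prints the exact
command to run from the Hadoop source root:

```shell
# Hypothetical JDK 1.5 install location -- adjust for your machine.
JAVA5_HOME=/usr/lib/jvm/java-1.5.0-sun
# Print the Ant invocation; run it from the top of the Hadoop source tree.
echo "ant -Djavac.args=\"-Xlint -Xmaxwarns 1000\" -Djava5.home=$JAVA5_HOME tar"
```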

Hadoop build error

Posted by raghu kishor <ra...@yahoo.com>.
Hi,

While trying to compile the Hadoop source (ant -Djavac.args="-Xlint
-Xmaxwarns 1000" tar) I get the error below. Kindly let me know how
to fix this issue.
I have Java 6 installed.

  [javadoc] Standard Doclet version 1.6.0_07
  [javadoc] Building tree for all the packages and classes...
  [javadoc] Building index for all the packages and classes...
  [javadoc] Building index for all classes...

java5.check:

BUILD FAILED
/home/raghu/src-hadoop/trunk/build.xml:890: 'java5.home' is not defined.  Forrest requires Java5.  Please pass -Djava5.home=<base of Java 5 distribution> to Ant on the command-line.


Thanks,
Raghu


      

Re: Problems getting Eclipse Hadoop plugin to work.

Posted by Erik Holstad <er...@gmail.com>.
Hi guys!
Thanks for your help, but still no luck. I did try to set it up on a
different machine with Eclipse 3.2.2 and the IBM plugin instead of the
Hadoop one; for that one I only needed to fill out the install directory
and the host, and it worked just fine.
I have filled out the ports correctly, and the cluster is up and running
and works just fine.

Regards Erik

Re: Problems getting Eclipse Hadoop plugin to work.

Posted by Iman <ie...@cs.uwaterloo.ca>.
This thread helped me fix a similar problem: 
http://mail-archives.apache.org/mod_mbox/hadoop-core-user/200807.mbox/%3CC001E847C1FD4248A7D6537643690E2101C8300D@mse16be2.mse16.exchange.ms%3E 


In my case, the ports I had specified in hadoop-site.xml for the
name node and the job tracker were switched in the Map/Reduce location's
configuration.

Iman.
P.S. I sent this reply to the wrong thread before.
Erik Holstad wrote:
> Thanks guys!
> Running Linux and the remote cluster is also Linux.
> I have the properties set up like that already on my remote cluster, but
> not sure where to input this info into Eclipse.
> And when changing the ports to 9000 and 9001 I get:
>
> Error: java.io.IOException: Unknown protocol to job tracker:
> org.apache.hadoop.dfs.ClientProtocol....
>
> Regards Erik
>
>   


Re: Problems getting Eclipse Hadoop plugin to work.

Posted by Rasit OZDAS <ra...@gmail.com>.
Erik, did you place the ports correctly in the properties window?
Port 9001 goes under "Map/Reduce Master" on the left, and 9000 under "DFS
Master" on the right.


2009/2/19 Erik Holstad <er...@gmail.com>

> Thanks guys!
> Running Linux and the remote cluster is also Linux.
> I have the properties set up like that already on my remote cluster, but
> not sure where to input this info into Eclipse.
> And when changing the ports to 9000 and 9001 I get:
>
> Error: java.io.IOException: Unknown protocol to job tracker:
> org.apache.hadoop.dfs.ClientProtocol....
>
> Regards Erik
>



-- 
M. Raşit ÖZDAŞ

Re: Problems getting Eclipse Hadoop plugin to work.

Posted by Erik Holstad <er...@gmail.com>.
Thanks guys!
Running Linux, and the remote cluster is also Linux.
I have the properties set up like that already on my remote cluster, but
I'm not sure where to input this info into Eclipse.
When changing the ports to 9000 and 9001 I get:

Error: java.io.IOException: Unknown protocol to job tracker:
org.apache.hadoop.dfs.ClientProtocol....

Regards Erik

Re: Problems getting Eclipse Hadoop plugin to work.

Posted by Norbert Burger <no...@gmail.com>.
What platform are you running Eclipse on?  If Windows, see this thread
regarding Cygwin:

http://www.mail-archive.com/core-user@hadoop.apache.org/msg07669.html

In my case, I've never had to touch any of the plugin's advanced
parameters. Usually, setting just the Map/Reduce Master and DFS Master (and
associated ports) has been sufficient.

Norbert

On 2/18/09, Erik Holstad <er...@gmail.com> wrote:
>
> I'm using Eclipse 3.3.2 and want to view my remote cluster using the Hadoop
> plugin.
> Everything shows up and I can see the map/reduce perspective but when
> trying
> to
> connect to a location I get:
> "Error: Call failed on local exception"
>
> I've set the host to for example xx0, where xx0 is a remote machine
> accessible from
> the terminal, and the ports to 50020/50040 for M/R master and
> DFS master respectively. Is there anything I'm missing to set for remote
> access to the
> Hadoop cluster?
>
> Regards Erik
>

Re: Problems getting Eclipse Hadoop plugin to work.

Posted by Rasit OZDAS <ra...@gmail.com>.
Erik,
Try adding the following properties to hadoop-site.xml:

        <property>
                <name>fs.default.name</name>
                <value>hdfs://<ip_address>:9000</value>
        </property>
        <property>
                <name>mapred.job.tracker</name>
                <value><ip_address>:9001</value>
        </property>

This pins the ports explicitly. Then use port 9001 for the M/R master and
9000 for the DFS master in your properties window.
If it still doesn't work, try using the IP address instead of the host name
as the target host.
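Before retrying in Eclipse, it can also help to confirm that the two ports are
actually reachable from the machine running Eclipse. A minimal sketch, using
the example host name xx0 from the original mail (substitute your own master's
name or IP):

```shell
# Example host from the thread; substitute your NameNode/JobTracker machine.
HOST=xx0
# 9000 = fs.default.name (NameNode), 9001 = mapred.job.tracker (JobTracker).
for PORT in 9000 9001; do
  if nc -z "$HOST" "$PORT" 2>/dev/null; then
    echo "$HOST:$PORT reachable"
  else
    echo "$HOST:$PORT NOT reachable"
  fi
done
```

If a port shows as unreachable, check for a firewall between the two machines
before blaming the plugin configuration.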

Hope this helps,
Rasit

2009/2/18 Erik Holstad <er...@gmail.com>

> I'm using Eclipse 3.3.2 and want to view my remote cluster using the Hadoop
> plugin.
> Everything shows up and I can see the map/reduce perspective but when
> trying
> to
> connect to a location I get:
> "Error: Call failed on local exception"
>
> I've set the host to for example xx0, where xx0 is a remote machine
> accessible from
> the terminal, and the ports to 50020/50040 for M/R master and
> DFS master respectively. Is there anything I'm missing to set for remote
> access to the
> Hadoop cluster?
>
> Regards Erik
>



-- 
M. Raşit ÖZDAŞ