Posted to common-user@hadoop.apache.org by Manish <ma...@gmail.com> on 2011/08/05 10:50:21 UTC

hadoop on fedora 15

Hi,

Has anybody been able to run Hadoop in standalone mode on Fedora 15?
I have installed it correctly. Jobs run through the map phase but get stuck in reduce,
failing with the error "mapred.JobClient Status : FAILED Too many fetch-failures".
I have read several articles on the net about this problem; most point to /etc/hosts
and some to a firewall issue. I opened the firewall for the port range and also
checked my /etc/hosts file; its only line is "localhost".

Is Sun Java absolutely necessary, or will OpenJDK work?

Can someone give me a suggestion for getting past this problem?

Thanks & regards

Manish


Re: hadoop on fedora 15

Posted by Marcos Ortiz <ml...@uci.cu>.

On 04/26/2012 01:49 AM, john cohen wrote:
> I had the same issue.  My problem was that I was connected
> to the work VPN while running M/R jobs on my Mac. It occurred
> to me that maybe Hadoop was binding to the wrong IP (the IP
> assigned after connecting through the VPN). Bottom line: I
> disconnected from the VPN, and the M/R job finished as
> expected after that.
>
That makes sense: once you connect to the VPN, your machine is assigned a
different IP address by the private network. You can test this by updating
your configuration to use the new VPN-assigned IPs.
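
A quick way to test this is to compare what your hostname resolves to with the
addresses the Hadoop daemons are actually listening on, before and after the VPN
comes up. A minimal sketch, assuming Linux netstat flags and the usual 0.20-era
default ports (adjust the port list to your configuration):

    # what does the hostname resolve to right now?
    getent hosts $(hostname)

    # which addresses are the Hadoop daemons bound to?
    netstat -tlnp 2>/dev/null | grep -E ':(9000|9001|50030|50060|50070) '

If the two disagree once the VPN is connected, either disconnect the VPN as above
or pin the daemon addresses (fs.default.name, mapred.job.tracker) to 127.0.0.1 so
the VPN-assigned IP never comes into play.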

-- 
Marcos Luis Ortíz Valmaseda (@marcosluis2186)
  Data Engineer at UCI
  http://marcosluis2186.posterous.com




Re: hadoop on fedora 15

Posted by john cohen <jo...@nokia.com>.
I had the same issue.  My problem was that I was connected
to the work VPN while running M/R jobs on my Mac. It occurred
to me that maybe Hadoop was binding to the wrong IP (the IP
assigned after connecting through the VPN). Bottom line: I
disconnected from the VPN, and the M/R job finished as
expected after that.


Re: hadoop on fedora 15

Posted by Harsh J <ha...@cloudera.com>.
Sun JDK is what Hadoop has been thoroughly tested on. You can perhaps run on
OpenJDK, but YMMV.

Hadoop has a strict requirement of a proper network setup before use.

What port range did you open? The TaskTracker uses port 50060 for
intercommunication (over lo, if it's bound to that). Check that your
daemons are binding to the right interfaces and have proper name->IP
resolution, and then check that communication is allowed on that port.
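
For example, something along these lines (standard Fedora tools; 50060 is the
default TaskTracker HTTP port, adjust it if you have overridden it in
mapred-site.xml) will show whether the TaskTracker is listening, whether the
hostname resolves sensibly, and whether iptables is getting in the way:

    # is anything listening on the TaskTracker HTTP port, and on which interface?
    netstat -tlnp 2>/dev/null | grep 50060

    # does the machine's hostname resolve to a usable address?
    getent hosts $(hostname)
    cat /etc/hosts   # a single-node box typically needs at least:
                     #   127.0.0.1   localhost localhost.localdomain

    # is the firewall dropping traffic on that port?
    sudo iptables -L -n | grep 50060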

On Fri, Aug 5, 2011 at 2:20 PM, Manish <ma...@gmail.com> wrote:
> Hi,
>
> Has anybody been able to run Hadoop in standalone mode on Fedora 15?
> I have installed it correctly. Jobs run through the map phase but get stuck in reduce,
> failing with the error "mapred.JobClient Status : FAILED Too many fetch-failures".
> I have read several articles on the net about this problem; most point to /etc/hosts
> and some to a firewall issue. I opened the firewall for the port range and also
> checked my /etc/hosts file; its only line is "localhost".
>
> Is Sun Java absolutely necessary, or will OpenJDK work?
>
> Can someone give me a suggestion for getting past this problem?
>
> Thanks & regards
>
> Manish
>
>



-- 
Harsh J

Re: hadoop on fedora 15

Posted by madhu phatak <ph...@gmail.com>.
disable iptables and try again
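
On Fedora 15, which uses systemd, a minimal sketch of that (remember to turn the
firewall back on and open the needed ports once you have confirmed it was the
culprit):

    # stop the firewall for this session
    sudo systemctl stop iptables.service

    # or just flush the rules that are currently loaded
    sudo iptables -F

    # optionally keep it from starting again on boot
    sudo systemctl disable iptables.service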

On Fri, Aug 5, 2011 at 2:20 PM, Manish <ma...@gmail.com> wrote:

> Hi,
>
> Has anybody been able to run Hadoop in standalone mode on Fedora 15?
> I have installed it correctly. Jobs run through the map phase but get stuck in reduce,
> failing with the error "mapred.JobClient Status : FAILED Too many fetch-failures".
> I have read several articles on the net about this problem; most point to /etc/hosts
> and some to a firewall issue. I opened the firewall for the port range and also
> checked my /etc/hosts file; its only line is "localhost".
>
> Is Sun Java absolutely necessary, or will OpenJDK work?
>
> Can someone give me a suggestion for getting past this problem?
>
> Thanks & regards
>
> Manish
>
>


-- 
Join me at http://hadoopworkshop.eventbrite.com/