Posted to common-user@hadoop.apache.org by "Eason.Lee" <le...@gmail.com> on 2009/09/18 02:49:19 UTC

Job Failure, Run it again? Or anything else to do?

I ran my job overnight, but found that 4 of the maps failed with a
"java.lang.OutOfMemoryError: Java heap space" exception.
Do I have to run the whole job again, or is there anything else I can do?

Re: Job Failure, Run it again? Or anything else to do?

Posted by "Eason.Lee" <le...@gmail.com>.
Thanks all the same!

2009/9/18 Harish Mallipeddi <ha...@gmail.com>

> On Fri, Sep 18, 2009 at 6:19 AM, Eason.Lee <le...@gmail.com> wrote:
>
> > I ran my job overnight, but found that 4 of the maps failed with a
> > "java.lang.OutOfMemoryError: Java heap space" exception.
> > Do I have to run the whole job again, or is there anything else I can do?
> >
>
> If they failed due to OOM, they'll probably fail again. You might want to
> increase the heap size. And yes, if a job has failed, you have to restart
> the entire job.
>
> --
> Harish Mallipeddi
> http://blog.poundbang.in
>

Re: Job Failure, Run it again? Or anything else to do?

Posted by Harish Mallipeddi <ha...@gmail.com>.
On Fri, Sep 18, 2009 at 6:19 AM, Eason.Lee <le...@gmail.com> wrote:

> I ran my job overnight, but found that 4 of the maps failed with a
> "java.lang.OutOfMemoryError: Java heap space" exception.
> Do I have to run the whole job again, or is there anything else I can do?
>

If they failed due to OOM, they'll probably fail again. You might want to
increase the heap size. And yes, if a job has failed, you have to restart
the entire job.

-- 
Harish Mallipeddi
http://blog.poundbang.in
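
On the Hadoop versions current at the time of this thread (0.20.x), the heap-size increase Harish suggests is controlled by the mapred.child.java.opts property, which sets the JVM options for each spawned map/reduce task. A minimal sketch of the config change (the 1024m value is an illustrative assumption, not from the thread; size it to your tasks' actual memory needs):

```xml
<!-- mapred-site.xml (or the per-job configuration):
     raise the JVM heap for each map/reduce child task.
     1024m is an example value, not a recommendation. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1024m</value>
</property>
```

If the job is submitted through ToolRunner/GenericOptionsParser, the same setting can be passed per run on the command line with -D mapred.child.java.opts=-Xmx1024m, avoiding a cluster-wide config change.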