Posted to common-user@hadoop.apache.org by Ted Dunning <td...@veoh.com> on 2007/07/01 23:10:48 UTC

Re: OutOfMemory

If you are using machines with only 512MB of memory, it is probably a very
bad idea to set the minimum heap size so large.

-Xms400M might be more appropriate.
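
For reference, a minimal sketch of where that setting would go, assuming the
usual mapred.child.java.opts property in hadoop-site.xml (check the default
config that ships with your version for the exact name and default):

    <property>
      <name>mapred.child.java.opts</name>
      <!-- -Xmx is added here as an assumption: -Xms must not exceed the
           max heap, and the shipped default max is smaller than 400M -->
      <value>-Xms400M -Xmx400M</value>
    </property>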

I should say, though, that if you have a program that is worth running on
hadoop, you have a problem that is worth giving more memory to each processor.
Most of the work I do benefits more from memory than from processor, at
least up to 1-2GB of RAM.

On 6/30/07 11:51 AM, "Avinash Lakshman" <al...@facebook.com> wrote:

> There is an element in the config for Java params. Set it to -Xms1024M
> and give it a shot. It definitely seems like a case of you running
> out of heap space.
> 
> A
> -----Original Message-----
> From: Emmanuel JOKE [mailto:jokeout@gmail.com]
>  ...
> My cluster of 2 machines uses 512 MB of memory each. Isn't that enough?
> What is the best practice?
> 
> Do you have any idea if there is a bug? Or is it just my conf that is not
> correct?
> 
> Thanks for your help