Posted to mapreduce-user@hadoop.apache.org by web service <wb...@gmail.com> on 2010/10/19 18:03:19 UTC

Out of memory error.

I have a simple MapReduce program which runs fine under Eclipse. However,
when I execute it on Hadoop, it gives me an out-of-memory error.
HADOOP_HEAPSIZE is 2000 MB.

Not sure what the problem is.

Re: Out of memory error.

Posted by Juwei Shi <sh...@gmail.com>.
You should increase the heap size of the child JVMs that run the map and
reduce tasks, not the heap size of the Hadoop daemons such as the JobTracker
and TaskTracker. By default, Hadoop allocates 1000 MB of memory to each
daemon it runs; this is controlled by the HADOOP_HEAPSIZE setting in
hadoop-env.sh. Note that this value does not apply to the child JVMs that
run the map and reduce tasks.

The memory given to each of these child JVMs can be changed by setting the
mapred.child.java.opts property. The default setting is -Xmx200m, which
gives each task 200 MB of memory.
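
For what it's worth, here is a minimal sketch of setting that property from
the job driver, assuming the 0.20-era org.apache.hadoop.mapreduce API; the
class name, job name, and 1 GB heap value are illustrative, not from this
thread:

    // Raise the per-task child JVM heap above the 200 MB default.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class HeapConfigExample {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Illustrative value: give each map/reduce task JVM 1 GB.
        conf.set("mapred.child.java.opts", "-Xmx1024m");
        Job job = new Job(conf, "heap-config-example");
        // ... set jar, mapper, reducer, and input/output paths as usual ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }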

Re: Out of memory error.

Posted by Shrijeet Paliwal <sh...@rocketfuel.com>.
Where exactly is it failing? Are the map/reduce tasks failing, or is it something else?

Re: Out of memory error.

Posted by Yin Lou <yi...@gmail.com>.
Hi,

You can increase the heap size by passing -D mapred.child.java.opts="-d64 -Xmx4096m" on the command line; the -d64 flag requests the 64-bit JVM.
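
Note that -D options on the command line are only applied if your driver runs
through ToolRunner (GenericOptionsParser). A minimal sketch of that pattern,
with an illustrative driver class name:

    // A driver implementing Tool, so GenericOptionsParser copies any
    // -D key=value pairs from the command line into the job Configuration.
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class MyJobDriver extends Configured implements Tool {
      public int run(String[] args) throws Exception {
        // getConf() already contains the -D overrides at this point.
        Job job = new Job(getConf(), "my-job");
        job.setJarByClass(MyJobDriver.class);
        // ... set mapper, reducer, and input/output paths as usual ...
        return job.waitForCompletion(true) ? 0 : 1;
      }

      public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new MyJobDriver(), args));
      }
    }

You would then invoke it as, e.g.:

    hadoop jar myjob.jar MyJobDriver -D mapred.child.java.opts="-d64 -Xmx4096m" <input> <output>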

Hope it helps.
Yin
