Posted to common-user@hadoop.apache.org by Raj hadoop <ra...@gmail.com> on 2014/02/24 11:08:20 UTC

heap space error

Hi All

In our MapReduce job, when we provide more than 10 input sequence files we hit
a Java heap space error. Please find attached a screenshot of the error and the
log file of the failed task. The program works fine when the number of input
files is 10.

I tried to set the number of map tasks to 10, both from code and as an
argument to the jar, but neither works.

Alternatives tried:

   1. While running the MapReduce jar, passing -D mapred.map.tasks=10 on the
   command line (see the driver sketch below).

   2. In code, switching from Job to JobConf so that the number of map tasks
   can be set:

   // Note: setNumMapTasks() is only a hint; the actual number of map tasks
   // follows the number of input splits.
   JobConf job = new JobConf(conf, SplitAutomation.class);
   job.setNumMapTasks(10);
Kindly help us as soon as possible; this is high priority.
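
For the -D flag in attempt 1 to reach the job configuration at all, the driver
has to go through ToolRunner so that GenericOptionsParser strips and applies
the generic options. Below is a minimal sketch of such a driver using the old
mapred API; the class name SplitAutomationDriver, the job name, and the
input/output handling are illustrative assumptions, not the actual code from
this thread.

    // Minimal driver sketch (illustrative only). Running through ToolRunner
    // lets GenericOptionsParser apply generic options such as
    // -D mapred.map.tasks=10 before run() is called.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.SequenceFileInputFormat;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class SplitAutomationDriver extends Configured implements Tool {

      @Override
      public int run(String[] args) throws Exception {
        // getConf() already contains any -D overrides parsed by ToolRunner.
        JobConf job = new JobConf(getConf(), SplitAutomationDriver.class);
        job.setJobName("split-automation");
        job.setInputFormat(SequenceFileInputFormat.class);
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Mapper/reducer classes and output types omitted for brevity;
        // fill these in before running.
        // setNumMapTasks() is only a hint; the real number of map tasks is
        // driven by the number of input splits.
        job.setNumMapTasks(10);
        JobClient.runJob(job);
        return 0;
      }

      public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(),
            new SplitAutomationDriver(), args));
      }
    }

Invocation would look roughly like: hadoop jar splitautomation.jar
SplitAutomationDriver -D mapred.map.tasks=10 /input/dir /output/dir (jar name
and paths are placeholders, and the -D options must come before the positional
arguments). Even so, mapred.map.tasks is only a hint: FileInputFormat never
lets a split span two files, so with more than 10 input files the job cannot be
forced down to 10 map tasks this way, which matches the advice later in this
thread that raising the mapper heap is the better fix.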

Re: heap space error

Posted by Dieter De Witte <dr...@gmail.com>.
No problem, it's not easy to learn all of Hadoop's configuration options.
Definitely consider looking into the reference book by Tom White (Hadoop: The
Definitive Guide).


2014-02-24 11:20 GMT+01:00 Raj hadoop <ra...@gmail.com>:

> Thanks a ton Dieter
>
>
> On Mon, Feb 24, 2014 at 3:45 PM, Dieter De Witte <dr...@gmail.com> wrote:
>
>> You can configure the heap size of the mappers with the following
>> parameter (in mapred-site.xml):
>>
>> mapred.map.child.java.opts=-Xmx3200m
>>
>> Also, setting the number of map tasks is not useful. You should set the
>> number of map slots per node:
>>
>> mapred.tasktracker.map.tasks.maximum=6
>>
>> Regards,
>> Dieter
>>
>>
>> 2014-02-24 11:08 GMT+01:00 Raj hadoop <ra...@gmail.com>:
>>
>>  Hi All
>>>
>>> In our MapReduce job, when we provide more than 10 input sequence files
>>> we hit a Java heap space error. Please find attached a screenshot of the
>>> error and the log file of the failed task. The program works fine when
>>> the number of input files is 10.
>>>
>>> I tried to set the number of map tasks to 10, both from code and as an
>>> argument to the jar, but neither works.
>>>
>>> Alternatives tried:
>>>
>>>    1. While running the MapReduce jar, passing -D mapred.map.tasks=10 on
>>>    the command line.
>>>
>>>    2. In code, switching from Job to JobConf so that the number of map
>>>    tasks can be set:
>>>
>>>    JobConf job = new JobConf(conf, SplitAutomation.class);
>>>
>>>    job.setNumMapTasks(10);
>>>
>>> Kindly help us as soon as possible; this is high priority.
>>>
>>>
>>>
>>>
>>>
>>
>

Re: heap space error

Posted by Raj hadoop <ra...@gmail.com>.
Thanks a ton Dieter


On Mon, Feb 24, 2014 at 3:45 PM, Dieter De Witte <dr...@gmail.com> wrote:

> You can configure the heap size of the mappers with the following
> parameter (in mapred-site.xml):
>
> mapred.map.child.java.opts=-Xmx3200m
>
> Also, setting the number of map tasks is not useful. You should set the
> number of map slots per node:
>
> mapred.tasktracker.map.tasks.maximum=6
>
> Regards,
> Dieter
>
>
> 2014-02-24 11:08 GMT+01:00 Raj hadoop <ra...@gmail.com>:
>
>  Hi All
>>
>> In our MapReduce job, when we provide more than 10 input sequence files
>> we hit a Java heap space error. Please find attached a screenshot of the
>> error and the log file of the failed task. The program works fine when
>> the number of input files is 10.
>>
>> I tried to set the number of map tasks to 10, both from code and as an
>> argument to the jar, but neither works.
>>
>> Alternatives tried:
>>
>>    1. While running the MapReduce jar, passing -D mapred.map.tasks=10 on
>>    the command line.
>>
>>    2. In code, switching from Job to JobConf so that the number of map
>>    tasks can be set:
>>
>>    JobConf job = new JobConf(conf, SplitAutomation.class);
>>
>>    job.setNumMapTasks(10);
>>
>> Kindly help us as soon as possible; this is high priority.
>>
>>
>>
>>
>>
>

Re: heap space error

Posted by Dieter De Witte <dr...@gmail.com>.
You can configure the heap size of the mappers with the following parameter
(in mapred-site.xml):

mapred.map.child.java.opts=-Xmx3200m

Also, setting the number of map tasks is not useful. You should set the
number of map slots per node:

mapred.tasktracker.map.tasks.maximum=6

Regards,
Dieter


2014-02-24 11:08 GMT+01:00 Raj hadoop <ra...@gmail.com>:

> Hi All
>
> In our MapReduce job, when we provide more than 10 input sequence files
> we hit a Java heap space error. Please find attached a screenshot of the
> error and the log file of the failed task. The program works fine when
> the number of input files is 10.
>
> I tried to set the number of map tasks to 10, both from code and as an
> argument to the jar, but neither works.
>
> Alternatives tried:
>
>    1. While running the MapReduce jar, passing -D mapred.map.tasks=10 on
>    the command line.
>
>    2. In code, switching from Job to JobConf so that the number of map
>    tasks can be set:
>
>    JobConf job = new JobConf(conf, SplitAutomation.class);
>
>    job.setNumMapTasks(10);
>
> Kindly help us as soon as possible; this is high priority.
>
>
>
>
>
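
As a companion to the reply above, here is a minimal sketch of applying the
map-task heap setting per job from the driver rather than cluster-wide in
mapred-site.xml. The property name and -Xmx value come from the reply; the
class name is a hypothetical placeholder, and older releases may only honour
the combined key mapred.child.java.opts. Note that
mapred.tasktracker.map.tasks.maximum is a TaskTracker-side setting and cannot
be changed from job code; it belongs in each node's mapred-site.xml and takes
effect after a TaskTracker restart.

    // Sketch only: per-job mapper heap, assuming the old mapred API used
    // elsewhere in this thread. Class name and heap value are illustrative.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapred.JobConf;

    public class MapperHeapConfig {

      // Returns a JobConf with a larger map-task JVM heap.
      public static JobConf withLargerMapHeap(Configuration base,
                                              Class<?> jobClass) {
        JobConf job = new JobConf(base, jobClass);
        // Map-task JVM options, as suggested in the reply above.
        job.set("mapred.map.child.java.opts", "-Xmx3200m");
        // On releases that only understand the combined key, use instead:
        // job.set("mapred.child.java.opts", "-Xmx3200m");
        return job;
      }
    }

The per-job override only takes effect if the cluster's mapred-site.xml does
not mark the property as final.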
