Posted to mapreduce-user@hadoop.apache.org by Mapred Learn <ma...@gmail.com> on 2012/07/29 10:17:14 UTC
how to set huge memory for reducer in streaming
hi,
One of my programs creates a huge Python dictionary, and the reducers fail
with a MemoryError every time.
Is there a way to specify a bigger memory value so the reducers
succeed?
I know we should not have this requirement in the first place and should
not create this kind of dictionary, but can I still finish this job by
giving more memory in the jar command?
Thanks,
JJ
Re: how to set huge memory for reducer in streaming
Posted by Harsh J <ha...@cloudera.com>.
ML,
Up to what mapred.child.ulimit values have you tried submitting with?
How large of a dict do you build in your program?
On Sun, Jul 29, 2012 at 1:54 PM, Mapred Learn <ma...@gmail.com> wrote:
> + CDH users
>
> Sent from my iPhone
>
> On Jul 29, 2012, at 1:17 AM, Mapred Learn <ma...@gmail.com> wrote:
>
>> hi,
>> One of my programs creates a huge Python dictionary, and the reducers fail with a MemoryError every time.
>>
>> Is there a way to specify a bigger memory value so the reducers succeed?
>>
>> I know we should not have this requirement in the first place and should not create this kind of dictionary, but can I still finish this job by giving more memory in the jar command?
>>
>>
>> Thanks,
>> JJ
>>
--
Harsh J
Re: how to set huge memory for reducer in streaming
Posted by Mapred Learn <ma...@gmail.com>.
+ CDH users
Sent from my iPhone
On Jul 29, 2012, at 1:17 AM, Mapred Learn <ma...@gmail.com> wrote:
> hi,
> One of my programs creates a huge Python dictionary, and the reducers fail with a MemoryError every time.
>
> Is there a way to specify a bigger memory value so the reducers succeed?
>
> I know we should not have this requirement in the first place and should not create this kind of dictionary, but can I still finish this job by giving more memory in the jar command?
>
>
> Thanks,
> JJ
>
Re: how to set huge memory for reducer in streaming
Posted by Mapred Learn <ma...@gmail.com>.
Hi Harsh,
I tried all of these, but it still fails.
Sent from my iPhone
On Jul 29, 2012, at 1:23 AM, Harsh J <ha...@cloudera.com> wrote:
> Hi,
>
> You may raise your heap size via mapred.child.java.opts (or
> mapred.reduce.child.java.opts for reducers alone), and further raise the
> virtual-memory limit via mapred.child.ulimit (try setting it to 2x or 3x
> the heap size, in KB, or higher). I think it's the latter you're running
> out of, since there's a subprocess involved.
>
> Let us know if that helps.
>
> On Sun, Jul 29, 2012 at 1:47 PM, Mapred Learn <ma...@gmail.com> wrote:
>> hi,
>> One of my programs creates a huge Python dictionary, and the reducers fail
>> with a MemoryError every time.
>>
>> Is there a way to specify a bigger memory value so the reducers
>> succeed?
>>
>> I know we should not have this requirement in the first place and should
>> not create this kind of dictionary, but can I still finish this job by
>> giving more memory in the jar command?
>>
>>
>> Thanks,
>> JJ
>>
>
>
>
> --
> Harsh J
Re: how to set huge memory for reducer in streaming
Posted by Harsh J <ha...@cloudera.com>.
Hi,
You may raise your heap size via mapred.child.java.opts (or
mapred.reduce.child.java.opts for reducers alone), and further raise the
virtual-memory limit via mapred.child.ulimit (try setting it to 2x or 3x
the heap size, in KB, or higher). I think it's the latter you're running
out of, since there's a subprocess involved.
Let us know if that helps.
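For a streaming job, these properties can be passed as generic -D options on the submit command. A minimal sketch of such an invocation (the input/output paths, the mapper/reducer script names, and the streaming jar location are hypothetical; the exact jar path varies by Hadoop installation and version):

```shell
# Hypothetical streaming job: raise reducer heap to 2 GB and set the
# virtual-memory ulimit to 3x the heap, expressed in KB
# (3 * 2048 MB * 1024 = 6291456 KB).
# Note: -D generic options must appear before the streaming-specific options.
hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-streaming.jar \
  -D mapred.reduce.child.java.opts=-Xmx2048m \
  -D mapred.child.ulimit=6291456 \
  -input /user/jj/input \
  -output /user/jj/output \
  -mapper reducer_test_mapper.py \
  -reducer reducer_test_reducer.py \
  -file reducer_test_mapper.py \
  -file reducer_test_reducer.py
```

Since the Python reducer runs as a subprocess of the task JVM, mapred.child.ulimit has to cover the JVM heap plus the Python process's memory, which is why a multiple of the heap size is suggested.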
On Sun, Jul 29, 2012 at 1:47 PM, Mapred Learn <ma...@gmail.com> wrote:
> hi,
> One of my programs creates a huge Python dictionary, and the reducers fail
> with a MemoryError every time.
>
> Is there a way to specify a bigger memory value so the reducers
> succeed?
>
> I know we should not have this requirement in the first place and should
> not create this kind of dictionary, but can I still finish this job by
> giving more memory in the jar command?
>
>
> Thanks,
> JJ
>
--
Harsh J