Posted to common-user@hadoop.apache.org by Harish Mallipeddi <ha...@gmail.com> on 2008/04/23 09:11:05 UTC

Re: reducer outofmemoryerror

Memory settings are in conf/hadoop-default.xml. You can override them in
conf/hadoop-site.xml.

Specifically, I think you would want to change mapred.child.java.opts.
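
For example, something along these lines in conf/hadoop-site.xml should do it (the -Xmx value below is only illustrative; pick a heap size that fits your nodes):

  <property>
    <name>mapred.child.java.opts</name>
    <!-- heap given to each child JVM that runs a map or reduce task;
         the stock default is -Xmx200m, 512m here is just an example -->
    <value>-Xmx512m</value>
  </property>

Keep in mind this heap is allocated per child JVM, so (task slots per node) x (heap size) needs to fit in the node's physical memory.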

On Wed, Apr 23, 2008 at 2:40 PM, Apurva Jadhav <aj...@thefind.com> wrote:

> Hi,
>  I have a 4-node Hadoop 0.15.3 cluster. I am using the default config
> files. I am running a MapReduce job to process 40 GB of log data.
> Some reduce tasks are failing with the following errors:
> 1)
> stderr
> Exception in thread "org.apache.hadoop.io.ObjectWritable Connection
> Culler" Exception in thread
> "org.apache.hadoop.dfs.DFSClient$LeaseChecker@1b3f8f6"
> java.lang.OutOfMemoryError: Java heap space
> Exception in thread "IPC Client connection to /127.0.0.1:34691"
> java.lang.OutOfMemoryError: Java heap space
> Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
>
> 2)
> stderr
> Exception in thread "org.apache.hadoop.io.ObjectWritable Connection
> Culler" java.lang.OutOfMemoryError: Java heap space
>
> syslog:
> 2008-04-22 19:32:50,784 INFO org.apache.hadoop.mapred.ReduceTask:
> task_200804212359_0007_r_000004_0 Merge of the 19 files in
> InMemoryFileSystem complete. Local file is
> /data/hadoop-im2/mapred/local/task_200804212359_0007_r_000004_0/map_22600.out
> 2008-04-22 20:34:16,012 INFO org.apache.hadoop.ipc.Client:
> java.net.SocketException: Socket closed
>       at java.net.SocketInputStream.read(SocketInputStream.java:162)
>       at java.io.FilterInputStream.read(FilterInputStream.java:111)
>       at org.apache.hadoop.ipc.Client$Connection$1.read(Client.java:181)
>       at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
>       at java.io.BufferedInputStream.read(BufferedInputStream.java:235)
>       at java.io.DataInputStream.readInt(DataInputStream.java:353)
>       at org.apache.hadoop.ipc.Client$Connection.run(Client.java:258)
>
> 2008-04-22 20:34:16,032 WARN org.apache.hadoop.mapred.TaskTracker: Error
> running child
> java.lang.OutOfMemoryError: Java heap space
> 2008-04-22 20:34:16,031 INFO org.apache.hadoop.mapred.TaskRunner:
> Communication exception: java.lang.OutOfMemoryError: Java heap space
>
> Has anyone experienced a similar problem? Is there any configuration
> change that can help resolve this issue?
>
> Regards,
> aj
>
>
>
>


-- 
Harish Mallipeddi
circos.com : poundbang.in/blog/