Posted to common-dev@hadoop.apache.org by "Owen O'Malley (JIRA)" <ji...@apache.org> on 2007/10/15 19:20:50 UTC
[jira] Resolved: (HADOOP-2053) OutOfMemoryError : Java heap space errors in hadoop 0.14
[ https://issues.apache.org/jira/browse/HADOOP-2053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Owen O'Malley resolved HADOOP-2053.
-----------------------------------
Resolution: Fixed
I just committed this. Thanks, Arun!
> OutOfMemoryError : Java heap space errors in hadoop 0.14
> --------------------------------------------------------
>
> Key: HADOOP-2053
> URL: https://issues.apache.org/jira/browse/HADOOP-2053
> Project: Hadoop
> Issue Type: Bug
> Components: mapred
> Affects Versions: 0.14.0, 0.14.1, 0.14.2
> Reporter: lohit vijayarenu
> Assignee: Arun C Murthy
> Priority: Blocker
> Fix For: 0.14.3
>
> Attachments: HADOOP-2053_1_20071015.patch
>
>
> In recent hadoop 0.14 we are seeing a few jobs where map tasks fail with a java.lang.OutOfMemoryError: Java heap space error.
> These are the same jobs that used to work fine with 0.13.
> <stack>
> task_200710112103_0001_m_000015_1: java.lang.OutOfMemoryError: Java heap space
> at java.util.Arrays.copyOf(Arrays.java:2786)
> at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
> at java.io.DataOutputStream.write(DataOutputStream.java:90)
> at org.apache.hadoop.io.Text.write(Text.java:243)
> at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:340)
> </stack>
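> For context, the failure mode in the stack trace is the classic unbounded-buffer growth pattern: java.io.ByteArrayOutputStream doubles its backing array (via Arrays.copyOf, the top frame above) whenever a write exceeds capacity, so collecting many serialized records into one in-memory stream with no spill threshold can exhaust the heap. The sketch below is a minimal, generic illustration of that growth pattern; it is not Hadoop's actual MapOutputBuffer code, and the buffer sizes and record counts are made up for demonstration.

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class BufferGrowthDemo {
    public static void main(String[] args) throws IOException {
        // Start with a tiny backing array; ByteArrayOutputStream will
        // repeatedly double it via Arrays.copyOf as writes overflow it.
        ByteArrayOutputStream buf = new ByteArrayOutputStream(32);
        DataOutputStream out = new DataOutputStream(buf);

        byte[] record = new byte[1024]; // stand-in for one serialized key/value
        for (int i = 0; i < 1000; i++) {
            // Each overflow allocates a new array and copies the old one;
            // with no upper bound or spill-to-disk, heap use grows with
            // the total volume of collected output.
            out.write(record);
        }
        System.out.println(buf.size()); // prints 1024000
    }
}
```

> A bounded collector avoids this by capping the in-memory buffer (cf. the io.sort.mb setting in mapred) and spilling sorted output to disk when the cap is reached, rather than letting the array grow with total output size.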
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.