Posted to common-dev@hadoop.apache.org by "Amareshwari Sriramadasu (JIRA)" <ji...@apache.org> on 2009/05/25 13:13:45 UTC
[jira] Created: (HADOOP-5907) NegativeArraySizeException in reducer with new api
NegativeArraySizeException in reducer with new api
--------------------------------------------------
Key: HADOOP-5907
URL: https://issues.apache.org/jira/browse/HADOOP-5907
Project: Hadoop Core
Issue Type: Bug
Components: mapred
Affects Versions: 0.20.0
Reporter: Amareshwari Sriramadasu
Fix For: 0.21.0
I observed one of the reducers failing with a NegativeArraySizeException when using the new API.
The exception trace:
{code}
java.lang.NegativeArraySizeException
at org.apache.hadoop.io.BytesWritable.setCapacity(BytesWritable.java:119)
at org.apache.hadoop.io.BytesWritable.setSize(BytesWritable.java:98)
at org.apache.hadoop.io.BytesWritable.readFields(BytesWritable.java:153)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:67)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:40)
at org.apache.hadoop.mapreduce.ReduceContext.nextKeyValue(ReduceContext.java:142)
at org.apache.hadoop.mapreduce.ReduceContext.nextKey(ReduceContext.java:121)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:189)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:542)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:409)
at org.apache.hadoop.mapred.Child.main(Child.java:159)
{code}
The corresponding line in ReduceContext is:
{code}
line#142 key = keyDeserializer.deserialize(key);
{code}
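For context, the trace points at BytesWritable.readFields: it reads a 4-byte length from the stream and then resizes its buffer to hold that many bytes. If the serialized stream is corrupt or misaligned, that length can deserialize as a negative int, and allocating a byte array of negative size throws NegativeArraySizeException. The following standalone sketch (not Hadoop's actual code; the readBytes helper is hypothetical) reproduces that failure mode:

{code}
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class NegativeSizeDemo {
    // Mimics the BytesWritable.readFields pattern: read a 4-byte length,
    // then allocate a buffer of that size and fill it from the stream.
    static byte[] readBytes(DataInputStream in) throws IOException {
        int size = in.readInt();     // a corrupt stream can yield a negative length here
        byte[] buf = new byte[size]; // throws NegativeArraySizeException when size < 0
        in.readFully(buf);
        return buf;
    }

    public static void main(String[] args) throws IOException {
        // The bytes 0xFF 0xFF 0xFF 0xFF deserialize to the int -1
        byte[] corrupt = {(byte) 0xFF, (byte) 0xFF, (byte) 0xFF, (byte) 0xFF};
        try {
            readBytes(new DataInputStream(new ByteArrayInputStream(corrupt)));
        } catch (NegativeArraySizeException e) {
            System.out.println("NegativeArraySizeException");
        }
    }
}
{code}

This only illustrates the mechanism; it does not identify where in the shuffle or merge path the length bytes were corrupted in the first place.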
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.