Posted to mapreduce-user@hadoop.apache.org by Ted Yu <yu...@gmail.com> on 2010/07/21 02:03:10 UTC

Re: INFO: Task Id : attempt_201007191410_0002_m_000000_0, Status : FAILED

What Hadoop version are you using?

I guess you haven't specified io.serializations in your Hadoop conf.
In that case, by default, your classes should implement org.apache.hadoop.io.Writable.
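
If I remember right, the stock default for that property is just WritableSerialization, i.e. something like the sketch below (illustrative fragment, not your code; WritableSerialization only knows how to serialize classes that implement Writable):

    import org.apache.hadoop.mapred.JobConf;

    JobConf conf = new JobConf();
    // Default value of io.serializations; only handles classes that
    // implement org.apache.hadoop.io.Writable.
    conf.set("io.serializations",
        "org.apache.hadoop.io.serializer.WritableSerialization");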

On Tue, Jul 20, 2010 at 6:01 AM, Khaled BEN BAHRI <
Khaled.Ben_bahri@it-sudparis.eu> wrote:

> Hi,
>
> When I wrote a MapReduce program I got this error.
> Please, can anyone help me?
>
> Jul 19, 2010 5:06:31 PM org.apache.hadoop.mapred.JobClient
> monitorAndPrintJob
> INFO: Task Id : attempt_201007191410_0002_m_000000_0, Status : FAILED
> java.lang.NullPointerException
>        at
> org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73)
>        at
> org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:797)
>        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:350)
>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>        at org.apache.hadoop.mapred.Child.main(Child.java:170)
>
>
> Thanks in advance for your help.
>
> Regards
>
>
>

Re: INFO: Task Id : attempt_201007191410_0002_m_000000_0, Status : FAILED

Posted by Aaron Kimball <aa...@cloudera.com>.
The most likely problem, I suspect, is that you're emitting a key or a value
to the OutputCollector that does not implement o.a.h.io.Writable. Your
input/output types should all do this. There are stock implementations of all
the basic scalar types in Hadoop's io package (IntWritable, LongWritable,
FloatWritable, Text for strings, etc.). You can, of course, write your own
Writable implementation if you need to use a complex structure (tuples, etc.)
as your intermediate key or value.
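
For example, a bare-bones pair Writable might look like this (untested sketch; the class and field names are just illustrative):

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;

    // Illustrative tuple of an int and a long used as an intermediate value.
    public class IntLongPairWritable implements Writable {
      private int first;
      private long second;

      public IntLongPairWritable() {}  // Hadoop needs the no-arg constructor

      public void set(int first, long second) {
        this.first = first;
        this.second = second;
      }

      public void write(DataOutput out) throws IOException {
        out.writeInt(first);
        out.writeLong(second);
      }

      public void readFields(DataInput in) throws IOException {
        first = in.readInt();
        second = in.readLong();
      }
    }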

Note that you should also use the JobConf.setMapOutputKeyClass() /
setMapOutputValueClass() methods to tell the framework which types you're
going to use.
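
Something along these lines in your job setup (untested sketch; the driver, mapper, and reducer class names here are placeholders for yours):

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class MyJob {                          // placeholder driver class
      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(MyJob.class);
        conf.setMapperClass(MyMapper.class);      // placeholder mapper/reducer
        conf.setReducerClass(MyReducer.class);
        // Intermediate (map output) types: these are what the map-side
        // serializer has to handle, so they must be Writable.
        conf.setMapOutputKeyClass(Text.class);
        conf.setMapOutputValueClass(IntWritable.class);
        // Final (reduce output) types.
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        JobClient.runJob(conf);
      }
    }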

- Aaron

On Tue, Jul 20, 2010 at 5:03 PM, Ted Yu <yu...@gmail.com> wrote:

> What Hadoop version are you using?
>
> I guess you haven't specified io.serializations in your Hadoop conf.
> In that case, by default, your classes should implement org.apache.hadoop.io.Writable.
>
>
> On Tue, Jul 20, 2010 at 6:01 AM, Khaled BEN BAHRI <
> Khaled.Ben_bahri@it-sudparis.eu> wrote:
>
>> Hi,
>>
>> When I wrote a MapReduce program I got this error.
>> Please, can anyone help me?
>>
>> Jul 19, 2010 5:06:31 PM org.apache.hadoop.mapred.JobClient
>> monitorAndPrintJob
>> INFO: Task Id : attempt_201007191410_0002_m_000000_0, Status : FAILED
>> java.lang.NullPointerException
>>        at
>> org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73)
>>        at
>> org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:797)
>>        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:350)
>>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>>        at org.apache.hadoop.mapred.Child.main(Child.java:170)
>>
>>
>> Thanks in advance for your help.
>>
>> Regards
>>
>>
>>
>