Posted to mapreduce-user@hadoop.apache.org by Thomas Anderson <t....@gmail.com> on 2011/02/23 13:57:31 UTC

Message in mapreduce class not logged

I have set up a cluster with several machines up and running, but I
have run into a problem: my mapper/reducer classes do not log anything.

The Hadoop version I use is 0.20.2. The rootLogger in log4j.properties
adds DRFA (a daily rolling file appender) and RFA (a rolling file
appender).
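For context, a sketch of the relevant parts of the stock Hadoop 0.20
log4j.properties (appender names follow the shipped defaults; the
${hadoop.*} values are filled in by the startup scripts, not hand-set
here). The key point is that the rootLogger appenders only cover the
daemon logs; task JVMs log through a separate TaskLogAppender:

```properties
# Daemon-side logging: goes to ${hadoop.log.dir}/${hadoop.log.file}
log4j.rootLogger=INFO,DRFA

log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n

# Task-side logging: map/reduce code in a task JVM writes through
# TLA into the per-attempt userlogs directory, not the daemon logs.
log4j.appender.TLA=org.apache.hadoop.mapred.TaskLogAppender
log4j.appender.TLA.taskId=${hadoop.tasklog.taskid}
log4j.appender.TLA.totalLogFileSize=${hadoop.tasklog.totalLogFileSize}
```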

My MapReduce class looks like this:

public class MyTest {
        public static final Log LOG = LogFactory.getLog(MyTest.class);

        public static class MiniMap
                        extends Mapper<LongWritable, Text, LongWritable, Text> {
                private long count = 0;

                public void map(LongWritable key, Text value, Context ctx)
                                throws IOException, InterruptedException {
                        String valueString = value.toString();
                        LOG.info("XXXXXXXXXXX value string obtained: " + valueString);
                        ctx.write(new LongWritable(count++), doSomething(valueString));
                }
        }
...
        static String fetch(String urlpath) {
          ...
                LOG.info("xxxxxxxxxxxxxx");
          ...
        }
}

The error shows java.lang.NullPointerException:

    [java] at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:849)
    [java] at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:541)
    [java] at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
    [java] at myapp.MyTest$MiniMap.map(MiniMap.java:51)
    [java] at myapp.MyTest$MiniMap.map(MiniMap.java:44)

The error message points to the line that fails, but the problem is
that I do not see any of the LOG.info(...) messages get logged.

What else do I need to turn on or modify so that the log messages can
be seen in e.g. hadoop/logs/hadoop-...log?

I appreciate any suggestions.

Re: Message in mapreduce class not logged

Posted by Thomas Anderson <t....@gmail.com>.
I noticed that the job messages do indeed get logged in the userlogs
dir, which contains the sys* files.
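To make the resolution concrete: in Hadoop 0.20, each task attempt gets
its own directory under userlogs, holding stdout, stderr, and syslog;
LOG.info(...) calls from map/reduce code end up in syslog there. A
sketch of the layout (the log dir default and the attempt ID below are
hypothetical, for illustration only):

```shell
# Per-task logs live under $HADOOP_LOG_DIR/userlogs/<attempt-id>/.
# The attempt ID encodes job ID, task type (m/r), and attempt number.
LOG_DIR=${HADOOP_LOG_DIR:-/usr/local/hadoop/logs}
ATTEMPT=attempt_201102231357_0001_m_000000_0

# LOG.info(...) output from the mapper goes to the syslog file here:
echo "$LOG_DIR/userlogs/$ATTEMPT/syslog"
```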

Thanks for the help.

On Wed, Feb 23, 2011 at 9:02 PM, Harsh J <qw...@gmail.com> wrote:
> The logs should appear in each ReduceTask's logs. Are they not
> appearing there?

Re: Message in mapreduce class not logged

Posted by Harsh J <qw...@gmail.com>.
The logs should appear in each ReduceTask's logs. Are they not
appearing there?




-- 
Harsh J
www.harshj.com